
Pharmacoepidemiology and Drug Safety

Wiley

All preprints, ranked by how well they match Pharmacoepidemiology and Drug Safety's content profile, based on 12 papers previously published here. The average preprint has a 0.06% match score for this journal, so anything above that is already an above-average fit. Older preprints may already have been published elsewhere.

1
An assessment of evidence to inform best practice for the communication of acute venous thromboembolism diagnosis: a scoping review

Mishra, S.; Klok, F. A.; Le Gal, G.; de Wit, K.; Schwartz, A.; Luijten, D.; Sadeghipour, P.; Bayley, J.; Woller, S.

2024-10-30 hematology 10.1101/2024.10.29.24316375
Top 0.1%
230× avg

Background: Physician communication with patients is a key aspect of excellent care. Scant evidence exists to inform best practice for physician communication with patients diagnosed with pulmonary embolism and deep vein thrombosis, collectively referred to as venous thromboembolism (VTE). The aim of this study was to summarize the existing literature on best practices for communication between healthcare providers and patients newly diagnosed with VTE.

Methods: We performed a scoping review of the extant literature on best practice for physician-patient communication and the diagnosis and management of VTE. Manuscripts on communication between healthcare professionals and patients with acute vascular diseases, including VTE, were eligible. Two authors independently reviewed titles, and consensus determined article inclusion. The manuscripts were further categorized into two main categories: best practice in communication and unmet needs in communication. Data aggregation was achieved by a modified thematic synthesis.

Results: Among 345 initial publications, 22 manuscripts met inclusion criteria: 11 addressed VTE, five pulmonary embolism, four deep vein thrombosis, one atrial fibrillation, and one acute coronary syndrome. Eleven manuscripts addressed communication of VTE diagnosis, while 12 focused on communication of VTE treatment. Eleven manuscripts identified unmet communication needs, and 14 addressed best practice. Our review shows that good communication surrounding VTE diagnosis and treatment can enhance satisfaction, while suboptimal communication can incur emotional, cognitive, behavioral, social, and health-systems adverse effects.

Conclusion: Scant literature guides best practices for communicating VTE diagnosis and treatment. Further research is necessary to establish practices for improving communication with VTE patients.

2
Clinical Outcomes of Hospitalized Patients with COVID-19 on Therapeutic Anticoagulants

Patel, N. G.; Bhasin, A.; Feinglass, J. M.; Belknap, S. M.; Angarone, M. P.; Cohen, E. R.; Barsuk, J. H.

2020-08-26 hematology 10.1101/2020.08.22.20179911
Top 0.1%
217× avg

Background: COVID-19 is associated with hypercoagulability and an increased incidence of thrombosis. We evaluated the clinical outcomes of adults hospitalized with COVID-19 who either continued previously prescribed therapeutic anticoagulants or were newly started on anticoagulants during hospitalization.

Methods: We performed an observational study of adult inpatients with COVID-19 at 10 hospitals affiliated with Northwestern Medicine in the Chicagoland area from March 9 to June 26, 2020. We compared the clinical outcomes of subjects who were continued on their outpatient therapeutic anticoagulation during hospitalization, and of those who were newly started on these medications, with those of subjects on prophylactic doses, based on the World Health Organization (WHO) Ordinal Scale for Clinical Improvement. The primary outcome was overall death; secondary outcomes were critical illness (WHO score ≥5), need for mechanical ventilation, and death among subjects who first had critical illness, adjusted for age, sex, race, body mass index (BMI), Charlson score, glucose on admission, and use of antiplatelet agents.

Results: 1,716 subjects with COVID-19 were included in the analysis. 171 subjects (10.0%) were continued on their outpatient therapeutic anticoagulation and 201 (11.7%) were started on new therapeutic anticoagulation during hospitalization. In subjects continued on home therapeutic anticoagulation, there were no differences in overall death, critical illness, mechanical ventilation, or death among subjects with critical illness compared to subjects on prophylactic anticoagulation. Subjects receiving new therapeutic anticoagulation for COVID-19 were more likely to die (OR 5.93; 95% CI 3.71-9.47), have critical illness (OR 14.51; 95% CI 7.43-28.31), need mechanical ventilation (OR 11.22; 95% CI 6.67-18.86), and die after first having critical illness (OR 5.51; 95% CI 2.80-10.87).

Conclusions: Continuation of outpatient-prescribed anticoagulants was not associated with improved clinical outcomes. Therapeutic anticoagulation for COVID-19 in the absence of other indications was associated with worse clinical outcomes.

3
CohortSymmetry: An R package to perform sequence symmetry analysis using the OMOP common data model

Chen, X.; Stanford, T.; Guo, Y.; Raventos, B.; Du, M.; Li, X.; Lam, A.; Corby, G.; Mercade-Besora, N.; Alcalde Herraiz, M.; Lopez-Guell, K.; Delmestri, A.; Man, W. Y.; PRIETO-ALHAMBRA, D.; Burn, E.; Catala, M.; Pratt, N.; Jodicke, A.; Newby, D.

2025-11-17 pharmacology and therapeutics 10.1101/2025.11.14.25340229
Top 0.1%
205× avg

Background: Real-world data are valuable for detecting adverse drug events, and Sequence Symmetry Analysis (SSA) is a simple yet effective method frequently used for this purpose. However, heterogeneous implementations across studies limit reproducibility and scalability. To address this, we developed an open-source R package that standardises SSA analytics using data mapped to the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM).

Methods: We developed CohortSymmetry, an R package that implements SSA for OMOP CDM data. The package was validated through unit testing and evaluated empirically by estimating adjusted sequence ratios (ASRs) with 95% confidence intervals (CIs) for 23 positive and 10 negative controls across six European databases, including CPRD GOLD (UK) and THIN® (Belgium, Italy, Romania, Spain, UK). Sensitivity and specificity were defined as the proportions of positive and negative controls correctly identified by SSA. Sensitivity analyses varied key parameters, including the washout period.

Results: CohortSymmetry passed high-coverage unit tests. Of 33 eligible controls, four showed results consistent with expectations across all databases; for example, the amiodarone-levothyroxine pair had a lower 95% CI bound >1 in each. Sensitivity was moderate, whereas specificity was high in the primary analyses. Parameter variation influenced outcomes; a 365-day prior observation requirement reduced specificity in CPRD GOLD from 75% to 38%.

Conclusions: CohortSymmetry enables reproducible SSA using OMOP CDM data. Differences across databases likely reflect heterogeneity in data capture and prescribing patterns. Limitations include residual data variability and SSA's susceptibility to time-varying confounding, underscoring the need for tailored analytic design in pharmacovigilance studies.

Key Messages:
- We developed CohortSymmetry, an open-source R package that standardises SSA analytics using OMOP CDM-mapped data, and verified the correctness of its functions via unit testing and application to real-world datasets.
- CohortSymmetry passed high-coverage tests, and among 33 selected controls, four showed results consistent with expectations across all databases; varying analytical parameters affected results.
- The package provides a reproducible and scalable framework for multi-database SSA studies, supporting robust pharmacovigilance, but careful specification of parameters is required to account for the characteristics of the medical domain under investigation.
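The core SSA estimand can be sketched in a few lines. Below is a minimal Python illustration of a crude sequence ratio with a rough Wald-type confidence interval on the log scale; this interval method is an assumption of the sketch, not CohortSymmetry's implementation (the package computes adjusted sequence ratios in R with its own interval methods).

```python
from math import exp, log, sqrt

def crude_sequence_ratio(n_drug_first: int, n_event_first: int):
    """Crude sequence ratio for sequence symmetry analysis (SSA).

    n_drug_first:  patients in whom the index drug preceded the marker event
    n_event_first: patients in whom the marker event preceded the index drug
    Returns (sr, lo, hi); the 95% CI is a rough Wald-type interval on log(SR),
    chosen for brevity here rather than taken from the package.
    """
    sr = n_drug_first / n_event_first
    se = sqrt(1 / n_drug_first + 1 / n_event_first)
    return sr, exp(log(sr) - 1.96 * se), exp(log(sr) + 1.96 * se)
```

A lower CI bound above 1 is the signal criterion the abstract describes for the amiodarone-levothyroxine example.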

4
Cardiovascular and Thromboembolic Safety Signals Associated with Non-Steroidal Anti-Inflammatory Drugs Initiation: A Sequence Symmetry Analysis

Bobba, S. S.; Rowlands, E. J.; Delmestri, A.; Man, W. Y.; Chen, X.; Li, X.; Saura-Lazaro, A.; PRIETO-ALHAMBRA, D.; Newby, D.

2025-11-27 epidemiology 10.1101/2025.11.26.25341008
Top 0.1%
204× avg

Background and aims: Non-steroidal anti-inflammatory drugs (NSAIDs) are widely used, yet real-world evidence on cardiovascular (CV) and venous thromboembolic (VTE) risks for individual agents is limited. We aimed to identify CV and VTE safety signals associated with NSAID initiation.

Methods: We conducted a sequence symmetry analysis including patients aged 18+ with one year of prior observation initiating NSAIDs between 2013 and 2023 using CPRD GOLD. Patients had a CV event [myocardial infarction (MI), arrhythmia, heart failure, haemorrhagic/ischaemic stroke] or VTE event [deep vein thrombosis (DVT), pulmonary embolism (PE)] within ±180 days of NSAID initiation. Adjusted sequence ratios (ASRs) with 95% confidence intervals were calculated. Analyses were stratified by sex, age (18-65 and 65+), proton pump inhibitor use, and different initiation windows (90 and 365 days).

Results: There were 19,383 patients with an NSAID prescription and a CV or VTE event (median age 66 [IQR 54-76] years, 54% male). Naproxen showed positive signals across all CV/VTE events, with the highest ASR for PE (ASR 3.03 [95% CI 2.63-3.51]). Ibuprofen showed signals across six events, with PE the highest (ASR 2.2 [1.88-2.59]). Diclofenac and etoricoxib showed positive signals for five events, with MI (ASR 3.30 [2.42-4.57]) and stroke (3.68 [2.14-6.62]) the highest. Celecoxib and meloxicam showed positive signals across four events, with heart failure (2.15 [1.17-4.11]) and PE (2.66 [1.30-5.82]) the highest. Stratified analyses mostly aligned with the main analysis.

Conclusions: Individual NSAIDs showed variable CV and VTE signals, likely reflecting prescribing patterns and use. These findings suggest class-wide CV/VTE risks, underscoring the need for careful individual assessment when initiating therapy.

Key Question: Do individual NSAIDs show safety signals for cardiovascular or venous thromboembolic events in real-world data?

Key Finding: Initiation of several NSAIDs showed temporal safety signals for cardiovascular and venous thromboembolic events, with positive signals observed across multiple individual NSAIDs.

Take-home Message: Clinicians should balance the risks and benefits of individual NSAIDs, supporting a more personalised approach to prescribing and highlighting the need for further evaluation of identified safety signals.

5
The Role Of Drug Indication On Incidence Rate Heterogeneity: A Large-Scale, Systematic Evaluation Across An International Network Of Observational Databases

Chen, H. Y.; Knoll, C.; Boventer, E.; Pratt, N.; Anand, T. V.; Van Zandt, M.; Morgan-Cooper, H.; Ryan, P.; Hripcsak, G. M.

2025-10-24 epidemiology 10.1101/2025.10.22.25338563
Top 0.1%
202× avg

Purpose: Incidence rate estimates are sensitive to a range of factors, including age, sex, and geographical setting (data source). The magnitude of the impact of drug indication on incidence rates remains underexplored.

Methods: We conducted an observational cohort study using 13 healthcare databases to estimate the incidence rates of 73 health outcomes across 8 drug classes with multiple indications. We calculated incidence rates for each drug-outcome pair and performed random-effects meta-analyses to pool results across databases. We then conducted variance components (VC) analysis to find the proportions of variability attributed to database, age, sex, and indication. We reported the median of the variance components across all 73 health outcomes as a measure of the magnitude of differences across indications, age, sex, and database, per drug class.

Results: Adjusting for database, age, and sex differences, the drug classes with the highest median indication VC were trimethoprim (0.49), SGLT-2 inhibitors (0.26), and beta blockers (0.10), while the drug class with the lowest was GLP-1 agonists (<0.01). Within each drug class, and adjusting for all other factors, age was frequently the strongest contributor to incidence variation (for 5/8 drug classes, the highest class-wide median VC was the age median VC), followed by database, indication, then biological sex.

Conclusion: This study showed that for some drug classes there is substantial variation in incidence rate estimates across indications, even after accounting for heterogeneity due to age, biological sex, and data source. As many drugs have multiple indications in clinical practice, it may be important to consider drug indication when estimating incidence rates in observational studies for patient safety evaluations.

6
Trends of drug use with suggested shortages and their alternatives across 41 real world data sources and 18 countries in Europe and North America

Pineda-Moncusi, M.; Rekkas, A.; Martinez Perez, A.; Leis, A.; Lopez Gomez, C.; Fey, E.; Bruninx, E.; Rodeiro, J.; Maljkovic, F.; Franz, M.; Mayer, M.-A.; Eleangovan, N.; Natsiavas, P.; Sen, S.; Cooper, S.; Reisberg, S.; Manlik, K.; Sanchez-Saez, F.; Pino, B. d.; Prats Uribe, A. P. U. A.; Yağız Uresin, A.; Danilovic Bastic, A.; Rodrigues, A. M.; Palomar-Cros, A.; Verbiest, A.; Erdoğan, B.; Dinkel-Keuthage, C.; Torre, C. O.; Beukelaar, C. d.; Eteve-Pitsaer, C.; Goncalves, C. F.; Palma, C. d.; Gavina, C.; Dedman, D.; Price, D. B.; Balan, D. G.; Enders, D.; Henke, E.; Scheurwegs, E.; Callewaert, E

2024-08-29 epidemiology 10.1101/2024.08.28.24312695
Top 0.1%
201× avg

Importance: Drug shortages leave affected patients in a vulnerable position.

Objective: To describe the incidence and prevalence of use of medicines with suggested shortages in at least one European country, as announced by the European Medicines Agency, and to characterise the users of these drugs, including the indication of use, duration of use, and dosage.

Design: We performed a descriptive cohort study from 2010 to 2024 in a network of databases that have mapped their data to the Observational Medical Outcomes Partnership (OMOP) Common Data Model (CDM).

Setting: Settings included primary care, secondary care, claims, and various disease registries.

Participants: We included all patients with at least 365 days of history in the database.

Exposures: All medicines with a suggested shortage in at least one European country for more than 365 days (n=18). We also assessed their key alternatives (n=39).

Main outcomes and measures: We estimated annual incidence rates and period prevalence. A drop in incidence or prevalence of >33% after the shortage was announced was considered confirmation of a shortage.

Results: Among 52 databases from Europe and the United States, we observed shortages according to decreased incidence of use for 8 drugs and according to decreased prevalence of use for 9 drugs. Varenicline and amoxicillin, alone or with clavulanate, were in shortage in the largest number of countries.

Conclusion and relevance: We compiled and analysed annual incidence and prevalence of use, plus information on patient characteristics, indication, and dose, for 57 medicines across 52 databases in Europe and the United States between 2010 and 2024. We detected shortages and observed changes in user characteristics for several drugs. These timely descriptions of real-world drug shortage scenarios, observed and unobserved across various healthcare settings and countries, help to better understand how drug shortages play out in real life.
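The shortage criterion above (a >33% drop in incidence or prevalence after the announcement) is a simple relative-change test. A minimal Python sketch, with an illustrative function name not taken from the study code:

```python
def shortage_flag(rate_before: float, rate_after: float, threshold: float = 1 / 3) -> bool:
    """Flag a shortage when use drops by more than `threshold`
    (the paper's criterion is a drop of >33%) after the announcement."""
    if rate_before <= 0:
        return False  # no baseline use to drop from
    return (rate_before - rate_after) / rate_before > threshold
```

For example, an incidence falling from 12 to 7 per 1,000 person-years (a ~42% drop) would be flagged, while a fall to 9 (25%) would not.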

7
Comparing DOACs with warfarin in AF patients with chronic kidney disease or valvular disease: A systematic review and meta-analysis

Liang, A.; Wang, C.; Iansavichene, A.; Lazo-Langner, A.

2024-01-15 hematology 10.1101/2024.01.13.24301121
Top 0.1%
200× avg

Objective: To analyze the safety and efficacy of direct oral anticoagulants (DOACs) compared to warfarin in patients with concomitant atrial fibrillation (AF) and valvular disease or concomitant AF and chronic kidney disease (CKD).

Methods: We conducted literature searches in MEDLINE, Embase, and EBM Reviews for randomized controlled trials (RCTs) and non-RCTs that included the aforementioned patient populations treated with warfarin or a DOAC (rivaroxaban, dabigatran, apixaban, or edoxaban) and assessed outcomes of bleeding, stroke, or systemic/arterial thromboembolism. Meta-analysis was performed for eligible studies using the Mantel-Haenszel method with a random-effects model.

Results: 3,172 studies were screened and 154 were selected after two levels of screening. Meta-analysis showed that, in patients with concomitant AF and CKD, DOACs were associated with reduced bleeding in non-RCTs (OR 0.65, 95% CI [0.49, 0.86], p=0.003), particularly in more severe CKD (eGFR < 60 mL/min/1.73 m2). Apixaban in particular was associated with reduced bleeding (OR 0.52, 95% CI [0.44, 0.63], p<0.00001) and stroke incidence (OR 0.60, 95% CI [0.41, 0.87], p=0.007). In patients with concomitant AF and valvular disease, DOACs were associated with reduced bleeding (OR 0.75, 95% CI [0.57, 0.97], p=0.03) and stroke incidence (OR 0.66, 95% CI [0.47, 0.93], p=0.02) in non-RCTs.

Conclusion: We examined populations typically excluded from large-scale anticoagulation studies; our findings suggest that DOACs may be superior to warfarin both in preventing thromboembolic events and in reducing bleeding risk in patients with concomitant CKD or valvular disease.
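The Mantel-Haenszel pooled odds ratio named in the Methods combines study-level 2×2 tables. A minimal fixed-effect version in Python (the paper additionally applies random-effects weighting, which is not shown here):

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel pooled odds ratio across studies.

    Each table is (a, b, c, d): a/b = events/non-events in one arm
    (e.g. DOAC), c/d = events/non-events in the comparator (e.g. warfarin).
    OR_MH = sum(a*d/n) / sum(b*c/n), with n the study total.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den
```

With a single study the formula reduces to the plain odds ratio (a*d)/(b*c), which is a quick sanity check.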

8
Creating and evaluating a decision support tool to reduce critically low hemoglobin and blood transfusions in-hospital

Fralick, M.; Lee, S.; Debnath, M.; Beaton, D.; Fritz, J.; Jones, B.; Dounaevskaia, V.; Lee, Y.; Colacci, M.; Bogler, O.; Rudzicz, F.; Mamdani, M.

2025-09-14 hematology 10.1101/2025.09.12.25335172
Top 0.1%
189× avg

Background: Hospitalized patients have multiple risk factors for bleeding, including critical illness, interventional procedures, frequent phlebotomy, and use of anticoagulant and antiplatelet medications. It is unknown whether a decision support tool that identifies inpatients at highest risk of bleeding can prevent critically low hemoglobin and its associated sequelae, such as requiring blood transfusion.

Objective: To create a decision support tool to identify hospitalized patients at high risk of bleeding sequelae ("high risk") and to evaluate the impact of this tool on rates of critically low hemoglobin and blood transfusion.

Methods: We conducted a cohort study of patients hospitalized under general internal medicine at a tertiary-care teaching hospital in Toronto, Ontario. We defined "high risk" as a hemoglobin < 86 g/L, an absolute decrease in hemoglobin of ≥ 30 g/L, or a platelet count < 50 × 10⁹/L. We then created, implemented, and prospectively evaluated a decision support tool to identify these high-risk patients and alert their clinical team. Our primary outcome was the percentage of patients who received a blood transfusion.

Results: Our retrospective pre-deployment phase included 6,401 hospitalizations and our prospective phase included 4,274 hospitalizations. The median age of patients was 67 years (IQR 52-80), 43% were female, and the median length of stay was 5 days (IQR 2-10). Overall, 9% had a hemoglobin of 70 g/L or lower and 10% received a blood transfusion in hospital. After model implementation, the median timing of alerts was 1 day after admission (IQR 1-4). The most common trigger for an alert was a hemoglobin < 86 g/L (N=624, 76%), followed by a decrease in hemoglobin ≥ 30 g/L (N=131, 16%) and a platelet count < 50 × 10⁹/L (N=37, 5%).

Deployment of the decision support tool was associated with a reduction in the primary outcome (OR 0.78, 95% CI 0.64-0.96), driven by a reduction in red blood cell transfusion (OR 0.78, 95% CI 0.64-0.96) rather than platelet transfusion (OR 1.10, 95% CI 0.46-2.91). We also observed a reduction in our secondary outcome of critically low hemoglobin (OR 0.81, 95% CI 0.65-0.99).

Interpretation: Our decision support tool was associated with a modest reduction in our primary outcome of blood transfusions and our secondary outcome of critically low hemoglobin. Because our study was single-centre, future larger studies are needed to validate our findings.
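The alert logic described in the Methods is a disjunction of three thresholds. A literal Python transcription of that rule, noting that implementation details such as how the baseline hemoglobin is defined are not given in the abstract:

```python
def high_risk_alert(hgb_now: float, hgb_baseline: float, platelets: float) -> bool:
    """True when any of the paper's stated triggers fires:
    hemoglobin < 86 g/L, an absolute hemoglobin drop >= 30 g/L from baseline,
    or platelet count < 50 (in units of 10^9/L)."""
    return hgb_now < 86 or (hgb_baseline - hgb_now) >= 30 or platelets < 50
```

So a patient dropping from 131 to 100 g/L triggers an alert on the drop criterion even though 100 g/L alone would not.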

9
Tranexamic acid in reducing expected blood loss in moderate to low risk surgeries: systematic review, meta-analysis and cost effectiveness analysis

Jaiswal, N.; Ciminata, G.; Robinson, W.; Taylor-Rowan, M.; Morris, T.; Nevill, C.; Tahir, H.; Fisher, E.; Mulholland, R.; Lumsden, M.; Noel-Storr, A.; Davies, A.; J Cooper, N.; Quinn, T.; Sutton, A.; Wu, O.

2025-07-21 hematology 10.1101/2025.07.21.25331903
Top 0.1%
184× avg

Tranexamic acid (TXA) is well established as a safe intervention for reducing transfusion requirements in surgeries with a high risk of blood loss. However, its role in surgeries classified as low risk for blood loss remains uncertain. Given the frequency of such procedures, even small clinical benefits could have a substantial cumulative impact. We assessed the clinical and cost-effectiveness of TXA in surgeries with low expected blood loss. A systematic review and meta-analysis of randomised controlled trials (RCTs) in adults or children undergoing low-risk surgeries, comparing peri-operative TXA (any route or dose) with placebo or standard care, informed the clinical effectiveness; a decision model adapted from NICE NG24, focusing on short-term hospital costs, informed the cost-effectiveness analysis. We included 82 RCTs comprising 8,506 participants. TXA significantly reduced blood loss (ratio of means 0.73, 95% CI 0.68-0.79) and transfusion rates (odds ratio 0.39, 95% CI 0.25-0.61). It also reduced hospital stay by 0.4 days (MD -0.40 days, 95% CI -0.77 to -0.02) and improved pain scores at 1 and 2 weeks postoperatively. Evidence for thrombotic events was limited and inconclusive. The cost-effectiveness analysis showed TXA was cost-saving (£156 per patient) and had a 99% probability of being cost-effective at the £20,000 per QALY threshold. Reductions in bleeding and improved recovery outcomes even in surgeries with low anticipated blood loss support broader use of TXA in surgical care and suggest revisiting existing guidelines to include surgeries with any bleeding risk. Further research should examine long-term safety and patient-reported outcomes.

10
Standardisation of terminology, calculation and reporting for assigning exposure duration to drug utilisation records from healthcare data sources: the CreateDoT framework

Riera-Arnau, J.; Paoletti, O.; Gini, R.; Thurin, N. H.; Souverein, P. C.; Abtahi, S.; Duran, C. E.; Pajouheshnia, R.; Roberto, G.

2026-02-19 epidemiology 10.64898/2026.02.18.26346576
Top 0.1%
174× avg

Background: In pharmacoepidemiological studies, the days of treatment (DoT) duration associated with individual electronic drug utilization records (DUR) is usually missing. Researcher-defined duration (RDD) calculation approaches, as opposed to data-driven approaches, can be used to estimate DoT based on specific choices and assumptions made by investigators; these are usually underreported or even undocumented. We aimed to develop a framework for the standardization of terminology, formulas, implementation, and reporting of possible RDD approaches.

Methods: A systematic classification of RDD calculation approaches was developed via expert consensus. Universal concepts used to operationalise RDDs were identified and described using standard terminologies. An open-source R function, CreateDoT, was created to implement the formulas, taking the universal concepts as input parameters. A step-by-step workflow was developed to facilitate implementation and reporting.

Results: RDD approaches were classified into two main classes: I) daily dose (DD)-based calculation approaches (n=3 formulas) and II) fixed-duration approaches (n=2). Seven universal concepts were identified to describe the five corresponding generalized formulas for DoT calculation. Input parameters of the CreateDoT function can be retrieved from source data through mapping to the universal concepts, or input by the investigator based on the chosen calculation approach. The input file structure itself serves as a standard reporting template for documenting the investigators' assumptions and methodological choices adopted for DoT calculation.

Conclusions: The CreateDoT framework can facilitate the documentation and reporting of RDD approaches for DoT calculation, increasing the transparency and reproducibility of pharmacoepidemiological studies regardless of the data model used, and facilitates sensitivity analyses to evaluate the impact of alternative assumptions in DoT calculation.
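As an illustration of the daily-dose-based class of RDD formulas the framework standardises, here is one hedged Python sketch. CreateDoT is an R function that generalises across several formulas and input concepts, so this is a conceptual example, not its actual code:

```python
def days_of_treatment(dispensed_units: float, strength_per_unit: float,
                      assumed_daily_dose: float) -> float:
    """One daily-dose (DD)-based RDD formula: total dispensed amount
    (units x strength) divided by an assumed daily dose, e.g. a WHO DDD.
    The choice of daily dose is exactly the investigator assumption the
    CreateDoT framework asks to be documented."""
    return dispensed_units * strength_per_unit / assumed_daily_dose
```

For example, a pack of 28 tablets of 500 mg at an assumed 1,000 mg/day yields 14 days of treatment; switching the assumed daily dose to 500 mg/day doubles the estimated duration, which is why such choices need explicit reporting.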

11
Long-Acting Injectable Buprenorphine Use and Treatment Attribute Priorities Among U.S. Buprenorphine Prescribers: A National Survey

Bormann, N. L.; Arndt, S.; Oesterle, T. S.

2026-02-03 addiction medicine 10.64898/2026.02.01.26345319
Top 0.1%
172× avg

Background: Long-acting injectable buprenorphine (LAI-BUP) is safe and effective; however, it is dramatically underutilized in comparison to oral formulations. Little is known about how buprenorphine prescribers view LAI-BUP and which medication attributes they prioritize when selecting treatment for opioid use disorder (OUD).

Methods: We conducted a secondary analysis of a national, cross-sectional online survey of U.S. physicians who prescribe buprenorphine for OUD. Respondents reported OUD caseload, LAI-BUP use, and the importance of medication attributes relevant to treatment selection (e.g., efficacy, safety, ease of administration, ease of prescribing, and administrative requirements). Providers were categorized as no LAI-BUP use or, among LAI-BUP prescribers, low vs. high use based on a median split. Group comparisons used chi-square (or Fisher's exact) tests for categorical variables and Jonckheere-Terpstra tests for ordinal responses.

Results: Among 125 respondents, 39 (31.2%) reported no patients receiving LAI-BUP. The remaining 86 (68.8%) were LAI-BUP prescribers, split evenly into low and high groups (n=43 each; 34.4%) using a median cut of 23.2%. LAI-BUP use did not differ meaningfully by specialty, region, or practice setting. Greater LAI-BUP use was reported by providers with larger OUD panels. Ratings of key medication attributes were uniformly high.

Conclusions: LAI-BUP remains underused, with uptake highest among clinicians managing larger OUD caseloads. Measured attitudes toward medication attributes did not explain these differences. Future work should assess clinic workflow, staffing, reimbursement, and REMS burden, testing targeted implementation strategies in mixed-methods trials. Identifying what shifts clinicians from no use to low and high use may guide scalable implementation interventions.

12
Evaluating signals generated in a large-scale sequence symmetry analyses: macrolides and heart failure; and NSAIDs and pneumonia

Sinnott, S.-J.; Lin, K. J.; Wang, S.; Hallas, J.; Desai, R.; Schneeweiss, S.; Gagne, J. J.

2020-03-22 pharmacology and therapeutics 10.1101/2020.03.19.20038596
Top 0.1%
165× avg

Objective: Using US claims data and up-to-date pharmacoepidemiological study design tools, we aimed to investigate two safety signals generated from a large-scale screening analysis using a self-controlled sequence symmetry design in Danish data: (1) macrolides and heart failure; and (2) non-steroidal anti-inflammatory drugs (NSAIDs) and pneumonia.

Methods: We used IBM MarketScan data to conduct two new-user, active-comparator cohort studies. In the macrolides example, the exposure was clarithromycin or azithromycin and the comparator was amoxicillin/clavulanate, in patients with sinusitis. In the NSAIDs example, the exposure was oral NSAIDs and the comparator was topical diclofenac, in patients with osteoarthritis. We used Cox proportional hazards regression to estimate hazard ratios (HRs) and 95% confidence intervals (CIs), adjusting for approximately 50 investigator-specified confounders in a propensity score (PS) matched analysis. In a secondary analysis, we used high-dimensional PS (hd-PS) to adjust for 200 additional proxy confounders.

Results: We had 1,012,364 propensity score matched patients exposed to clarithromycin or azithromycin versus amoxicillin/clavulanate. With 162 outcomes among clarithromycin or azithromycin exposed patients and 134 among amoxicillin/clavulanate, the HR for overall heart failure was 1.14 (95% CI 0.90-1.43). In the NSAIDs example, we included 94,490 patients after propensity score matching. With 794 pneumonia outcomes among oral NSAID patients and 700 among topical diclofenac, the HR was 0.98 (95% CI 0.89-1.09). Some upward bias was suspected, as larger HRs were observed in the days immediately following exposure in both examples. We found similar results in the hd-PS matched analyses for both examples.

Conclusion: Our findings for NSAIDs and pneumonia suggest the original signal may have been due to protopathic or detection bias. Our analyses for macrolides and heart failure with short-term follow-up also suggest bias, although we encourage further research.

13
High-Throughput Screening for Prescribing Cascades Among Real-World Angiotensin-II Receptor Blockers (ARBs) Initiators

Ndai, A.; Smith, K. M.; Keshwani, S.; Choi, J.; Luvera, M.; Beachy, T.; Calvet, M.; Pepine, C. J.; Schmidt, S.; Vouri, S. M.; Morris, E.; Smith, S. M.

2025-03-11 epidemiology 10.1101/2025.03.10.25323711
Top 0.1%
161× avg

Objective: Angiotensin-II receptor blockers (ARBs) are commonly prescribed; however, their adverse events may prompt new drug prescriptions, known as prescribing cascades. We aimed to identify potential ARB-induced prescribing cascades using high-throughput sequence symmetry analysis.

Methods: Using claims data from a national sample of Medicare beneficiaries (2011-2020), we identified new ARB users aged ≥66 years with continuous enrollment ≥360 days before and ≥180 days after ARB initiation. We screened for initiation of 446 other (non-antihypertensive) marker drug classes within ±90 days of ARB initiation, generating sequence ratios (SRs) reflecting the proportions of ARB users starting the marker class after versus before ARB initiation. Adjusted SRs (aSRs) accounted for prescribing trends over time, and for significant aSRs we calculated the naturalistic number needed to harm (NNTH); significant signals were reviewed by clinical experts for plausibility.

Results: We identified 320,663 ARB initiators (mean ± SD age 76.0 ± 7.2 years; 62.5% female; 91.5% with hypertension). Of the 446 marker classes evaluated, 17 signals were significant, and three (18%) were classified as potential prescribing cascades after clinical review. The strongest signals ranked by the lowest NNTH were benzodiazepine derivatives (NNTH 2130, 95% CI 1437-4525), adrenergics in combination with anticholinergics, including triple combinations with corticosteroids (NNTH 2656, 95% CI 1585-10074), and other antianemic preparations (NNTH 9416, 95% CI 6606-23784). The strongest signals ranked by highest aSR were other antianemic preparations (aSR 1.7, 95% CI 1.19-2.41), benzodiazepine derivatives (aSR 1.18, 95% CI 1.08-1.3), and adrenergics in combination with anticholinergics, including triple combinations with corticosteroids (aSR 1.12, 95% CI 1.03-1.22).

Conclusion: The identified prescribing cascade signals reflected known and possibly under-recognized ARB adverse events in this Medicare cohort. These hypothesis-generating findings require further investigation to determine the extent and impact of these prescribing cascades on patient outcomes.
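One way to see how an adjusted sequence ratio can translate into a naturalistic NNTH: treat n_after × (1 − 1/aSR) as the excess marker-drug initiations attributable to the index drug, and divide the number of initiators by that excess. This construction is an assumption of the sketch below; the paper's exact formulation may differ.

```python
def naturalistic_nnth(n_initiators: int, n_marker_after: int, asr: float) -> float:
    """Naturalistic number needed to harm from an SSA signal (sketch only).

    Excess marker-drug starts attributable to the index drug are estimated
    as n_marker_after * (1 - 1/aSR); NNTH = initiators per excess start.
    Larger NNTH = rarer cascade, matching the abstract's ranking by lowest NNTH.
    """
    if asr <= 1:
        raise ValueError("NNTH is only meaningful for a positive signal (aSR > 1)")
    excess = n_marker_after * (1 - 1 / asr)
    return n_initiators / excess
```

Under this construction, 1,000 initiators with 100 post-index marker starts at aSR 2.0 give 50 excess starts and an NNTH of 20.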

14
Adjusting for residual confounding using high-dimensional propensity scores in a study of inhaled corticosteroids and COVID-19 outcomes

Bokern, M.; Tazare, J.; Rentsch, C. T.; Quint, J. K.; Douglas, I. J.; Schultze, A.

2025-02-05 epidemiology 10.1101/2025.02.04.25321459
Top 0.1%
159× avg

In pharmacoepidemiologic studies of COVID-19, there were concerns about bias from residual confounding. We applied high-dimensional propensity scores (HDPS) to a case study investigating the role of inhaled corticosteroids (ICS) in COVID-19 to adjust for unmeasured confounding. We selected patients with chronic obstructive pulmonary disease on 01 March 2020 from Clinical Practice Research Datalink (CPRD) Aurum, comparing ICS/LABA(±LAMA) and LABA/LAMA users. ICS effects on the outcomes COVID-19 hospitalisation and death were assessed through weighted and unweighted Cox proportional hazards models. HDPS were estimated from primary care clinical records, prescriptions and hospitalisations. SNOMED-CT codes and dictionary of medicines and devices codes from CPRD Aurum were mapped to International Classification of Diseases 10th revision codes and British National Formulary paragraphs, respectively. We estimated propensity scores (PS) combining prespecified and HDPS covariates, selecting the top 100, 250, 500, 750 and 1000 covariates ranked by confounding potential. When excluding triple therapy users, the conventional PS-weighted estimates showed weak evidence of increased risk of COVID-19 hospitalisation among ICS users (HR 1.19 (95% CI 0.92-1.54)). Results varied slightly based on the number of covariates included in HDPS (HR using 100 HDPS covariates 1.01 (95% CI 0.76-1.33), HR using 250 HDPS covariates 1.24 (95% CI 0.83-1.87)). For COVID-19 death, conventional PS-weighted models showed weak evidence of harm of ICS when excluding triple therapy users (HR 1.24 (95% CI 0.87-1.75)). HDPS weighting moved estimates toward the null, suggesting no effect of ICS (HR using 250 HDPS covariates excluding triple therapy 1.08 (95% CI 0.73-1.59)). HDPS may have provided better confounding control for COVID-19 deaths and may be able to partially compensate for suboptimal comparison groups.
HDPS results can be sensitive to the number of covariates included, highlighting the importance of sensitivity analyses.

Key points
- Residual confounding, including residual confounding by indication, is a major concern in pharmacoepidemiologic studies of COVID-19 outcomes.
- We apply high-dimensional propensity scores (HDPS) to adjust for residual confounding in a case study of inhaled corticosteroids (ICS) on COVID-19 hospitalisation and death in CPRD Aurum.
- Conventional PS-weighted analyses suggested harmful effects of ICS on COVID-19 hospitalisation and, to a lesser extent, deaths.
- HDPS-weighted analyses of COVID-19 hospitalisations were sensitive to the number of covariates included, with results moving towards the null for smaller numbers of covariates and away from the null when including more covariates, while for deaths, estimates moved towards the null consistently.
- HDPS demonstrated promise in addressing confounding even when comparison groups are suboptimal, but its performance depends on the careful selection and ranking of covariates.

Plain Language Summary
A key challenge when researching the effects of medications using electronic health records is accounting for the fact that people who receive different medications often differ in important ways. Such differences, called confounding, are typically accounted for using statistical methods that require researchers to pre-specify all important confounders. A newer method, called high-dimensional propensity scores (HDPS), uses a data-driven approach to select which confounders to account for instead. These methods have not yet been applied to studies of inhaled corticosteroids and COVID-19 outcomes, an area where studies have found conflicting findings.
We used electronic health records from the UK to compare the risk of COVID-19 hospitalisation and death among patients with chronic obstructive pulmonary disease taking two different treatments (ICS/LABA and LABA/LAMA), using both conventional and HDPS methods. Our findings showed that HDPS can reduce important differences between patients (confounding), but that the results can be sensitive to the number of covariates included. This demonstrates the value of HDPS and the need for researchers to run their analyses under several different assumptions.
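The covariate-prioritisation step at the heart of HDPS can be sketched as follows (a minimal illustration, not the authors' implementation; the input statistics and names are hypothetical). Candidate covariates are ranked by the Bross bias multiplier, computed from each covariate's prevalence among exposed and unexposed patients and its association with the outcome:

```python
import math

def bross_bias(pc1, pc0, rr_cd):
    """Bross bias multiplier for a binary covariate:
    pc1, pc0 = covariate prevalence among exposed / unexposed,
    rr_cd    = covariate-outcome relative risk."""
    return (pc1 * (rr_cd - 1) + 1) / (pc0 * (rr_cd - 1) + 1)

def top_k_covariates(candidates, k):
    """Rank candidate covariates by |log(bias)| and keep the k with the
    greatest potential to confound the exposure-outcome association."""
    ranked = sorted(
        candidates,
        key=lambda c: abs(math.log(bross_bias(c["pc1"], c["pc0"], c["rr_cd"]))),
        reverse=True,
    )
    return [c["name"] for c in ranked[:k]]
```

Selecting the "top 100, 250, 500, 750 and 1000 covariates" in the abstract corresponds, roughly, to varying k in a ranking step like this one.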

15
Rise and Regional Disparities in Buprenorphine Utilization in the United States

Pashmineh, A. A. R.; Cruz-Mullane, A.; Podd, J. C.; Lam, W. S.; Kaleem, S. S.; Lockard, L. B.; Mandel, M. R.; Chung, D. Y.; Davis, C. S.; Nichols, S. D.; McCall, K. L.; Piper, B. J.

2019-09-09 addiction medicine 10.1101/19006163
Top 0.1%
145× avg

Aims: Buprenorphine is an opioid partial agonist used to treat Opioid Use Disorders (OUD). While several state and federal policy changes have attempted to increase buprenorphine availability, access remains well below optimal levels. This study characterized how buprenorphine utilization in the United States has changed over time and whether there are regional disparities in distribution. Measurements: Buprenorphine weights distributed from 2007 to 2017 were obtained from the Drug Enforcement Administration. Data were expressed as the percent change and as the mg per person in each state. Separately, the formulations for prescriptions covered by Medicaid (2008 to 2018) were examined. Findings: Buprenorphine distributed to pharmacies increased about seven-fold (476.8 to 3,179.9 kg) while the quantities distributed to hospitals grew five-fold (18.6 to 97.6 kg) nationally from 2007 to 2017. Buprenorphine distribution per person was almost 20-fold higher in Vermont (40.4 mg/person) relative to South Dakota (2.1 mg/person). There was a strong association between the number of waivered physicians per 100K population and distribution per state (r(49) = +0.76, p < .0005). The buprenorphine/naloxone sublingual film (Suboxone) was the predominant formulation (92.6% of 0.31 million Medicaid prescriptions) in 2008, but it accounted for less than three-fifths (57.3% of 6.56 million prescriptions) in 2018. Conclusions: Although buprenorphine availability has substantially increased over the last decade, distribution was highly non-homogeneous across the US.

16
Monitoring Report: GLP-1 RA Prescribing Trends - December 2024 Data

Gratzl, S.; Cartwright, B. M. G.; Rodriguez, P. J.; Gilbert, K.; Do, D.; Masters, N. B.; Stucky, N.

2025-03-07 endocrinology 10.1101/2025.03.06.25323524
Top 0.1%
139× avg

Background: Limited recent data exist on prescribing patterns and patient characteristics for glucagon-like peptide 1 receptor agonists (GLP-1 RAs), an important drug class used as anti-diabetic medication (ADM) for patients with type 2 diabetes mellitus (T2D) and/or anti-obesity medication (AOM) in patients with overweight or obesity. For brevity, we use the term GLP-1 RA to refer to both GLP-1 RA and dual GLP-1 RA/GIP medications. Objective: To describe recent trends in prescribing and dispensing of GLP-1-based medications in the US. Methods: Using a subset of real-world electronic health record (EHR) data from Truveta, a growing collective of health systems that provide more than 18% of all daily clinical care in the US, we identified people who were prescribed a GLP-1-based medication between January 01, 2019 and December 31, 2024. We describe prescribing volumes and patient characteristics over time, by medication, and by FDA-labeled use. Among the subset of patients for whom post-prescription dispensing data are available, we describe the proportion and characteristics of patients who were and were not dispensed a GLP-1 RA following their prescription. Results: 2,185,238 patients were prescribed a GLP-1 RA between January 2019 and December 2024, with 11,194,909 total prescriptions during this period. Among first-time prescriptions for which use could be established, 69.1% were ADMs and 30.9% were AOMs. Overall prescribing rates (GLP-1 RA prescriptions per total prescriptions) increased slightly from September to December 2024 (+5.02%); however, first-time prescribing rates declined over the same period (-6.62%). As of December 2024, GLP-1 RA prescriptions account for more than 7% of all prescriptions.

17
Coronavirus-19 and coagulopathy: A Systematic Review

Lee, S. G.; Fralick, M.; Tang, G.; Tse, B.; Baumann Kreuziger, L.; Cushman, M.; Juni, P.; Sholzberg, M.

2021-01-06 hematology 10.1101/2021.01.05.20248202
Top 0.1%
130× avg

Background: Understanding the association between Coronavirus Disease 2019 (COVID-19) and coagulopathy may assist clinical prognostication, and influence treatment and outcomes. We aimed to systematically describe the relationship between hemostatic laboratory parameters and important clinical outcomes among adults with COVID-19. Methods: A systematic review of randomized clinical trials, observational studies and case series published in PubMed (Medline), EMBASE, and CENTRAL from December 1, 2019 to March 25, 2020. Studies of adult patients with COVID-19 that reported at least one hemostatic laboratory parameter were included. Results: Data were extracted from 57 studies (N=12,050 patients) that met inclusion criteria. The average age of patients was 52 years and 45% were women. Of the included studies, 92.7% (N=38/41 studies) reported an average platelet count ≥150 x 10^9/L, 68.2% (N=15/22 studies) reported an average prothrombin time (PT) between 11-14 s, 55% (N=11/20 studies) reported an average activated partial thromboplastin time (aPTT) between 25-35 s, and 34.4% (N=11/32 studies) reported a D-dimer concentration above the upper limit of normal (ULN). Eight studies (7 cohorts and 1 case series) reported hemostatic lab values for survivors versus non-survivors. Among non-survivors, D-dimer concentrations were reported in 4 studies and all reported an average above the ULN. Interpretation: Most patients had a normal platelet count, an elevated D-dimer, and PT and aPTT values in the upper reference interval; D-dimer elevation appeared to correlate with poor outcomes. Further studies are needed to better correlate these hemostatic parameters with the risk of adverse outcomes such as thrombosis and bleeding.

18
Estimating the gestational age of spontaneous abortions identified via database algorithms: a literature review and empirical analysis in Norwegian register data

Srinivas, C.; Cohen, J. M.

2025-03-20 epidemiology 10.1101/2025.03.20.25324313
Top 0.1%
130× avg

Introduction: Spontaneous abortion is a common pregnancy outcome, but incomplete recording and missing gestational age in health databases pose challenges for research. Accurate timing of the start of pregnancy is critical information in drug safety studies. Objectives: To review the literature on database algorithms to estimate gestational length for spontaneous abortions and clinical studies that can inform such algorithms. To estimate the average gestational age for algorithm-identified spontaneous abortions in Norway using interrupted time series analysis. Methods: We used an algorithm to identify pregnancies registered in Norway from 2010-2020 and restricted to spontaneous abortions identified from registers of primary and specialist care, and births from the Medical Birth Registry of Norway. For births, we calculated the last menstrual period (LMP) by subtracting the recorded gestational age from the birth date. We assigned spontaneous abortions gestational ages ranging from 7 to 11 weeks and a corresponding LMP. We identified prescriptions from 70 days before to 97 days after LMP and calculated the number of antidepressant prescriptions per 10,000 pregnancies per day. We applied two-sample interrupted time series analysis with intervention points set at 28 and 55 days after LMP and compared antidepressant prescription trends after 28 gestational days for spontaneous abortions versus births. Results: Database algorithms have used estimates for the gestational age at spontaneous abortion ranging from 8-10 weeks, and clinical studies suggest a mean or median gestational age at spontaneous abortion of around 9-10 weeks. In our interrupted time series analysis including 122,495 spontaneous abortions and 631,929 births, the 7-week assumption showed no post-intervention trend, suggesting underestimation. The 9-week assumption closely matched the trend for births (-0.051 prescriptions/day, 95% CI -0.090 to -0.013 vs. -0.056, 95% CI -0.067 to -0.046).
The 8, 10, and 11-week assumptions showed less precise alignment. The best alignment occurred with the 64-day assumption (9.1 weeks). Conclusion: Our study provides an empirically derived estimate for the average gestational age for algorithm-identified spontaneous abortions which can be applied in future research using the same pregnancy algorithm in Norway. While the 64-day estimate seems most accurate for our dataset, further validation studies are necessary to confirm its applicability in other contexts.
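The interrupted time series logic used above can be sketched with a simple segmented least-squares fit (an illustration under simplified assumptions, not the study's two-sample model): a change in slope at the intervention day indicates a break in the prescription trend:

```python
import numpy as np

def segmented_fit(days, y, intervention_day):
    """Fit y = b0 + b1*t + b2*post + b3*(t - intervention)*post and
    return (intercept, baseline slope, level change, slope change)."""
    t = np.asarray(days, dtype=float)
    post = (t >= intervention_day).astype(float)
    X = np.column_stack([np.ones_like(t), t, post, post * (t - intervention_day)])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta
```

Comparing the post-intervention slope (b1 + b3) between the spontaneous-abortion series and the birth series is, roughly, what the trend comparison in the abstract does.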

19
Data management in substance use disorder treatment research: Implications from data harmonization of National Institute on Drug Abuse-funded randomized controlled trials

Susukida, R.; Aminesmaeili, M.; Mojtabai, R.

2020-05-03 addiction medicine 10.1101/2020.04.28.20081935
Top 0.1%
127× avg

Background: Secondary analysis of data from completed randomized controlled trials (RCTs) is a critical and efficient way to maximize the potential benefit from past research. De-identified primary data from completed RCTs have been increasingly available in recent years; however, the lack of standardized data products is a major barrier to further use of these valuable data. Pre-statistical harmonization of data structure, variables and codebooks across RCTs would facilitate secondary data analysis including meta-analysis and comparative effectiveness studies. We describe a data harmonization initiative to harmonize de-identified primary data from substance use disorder (SUD) treatment RCTs funded by the National Institute on Drug Abuse (NIDA) available on the NIDA Data Share website. Methods: Harmonized datasets with standardized data structures, variable names, labels, and definitions and harmonized codebooks were developed for 36 completed RCTs. Common data domains were identified to bundle data files from individual RCTs according to relevant subject areas. Variables within the same instrument were harmonized if at least two RCTs used the same instrument. The structures of the harmonized data were determined based on feedback from clinical trialists and SUD research experts. Results: We have created a harmonized database of variables across 36 RCTs with a built-in label and a brief definition for each variable. Data files from the RCTs have been consistently categorized into eight domains (enrollment, demographics, adherence, adverse events, physical health measures, mental-behavioral-cognitive health measures, self-reported substance use measures, and biologic substance use measures). Harmonized codebooks and instrument/variable concordance tables have also been developed to help identify instruments and variables of interest more easily.
Conclusions: The harmonized data from RCTs of SUD treatments can promote future secondary analysis of completed RCTs, allowing data from multiple RCTs to be combined, and provide guidance for future RCTs in SUD treatment research.
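The instrument/variable concordance tables described above can be thought of as per-trial rename maps applied against a shared codebook. A minimal sketch (trial names, variable names, and the harmonized codebook here are all hypothetical):

```python
# Hypothetical concordance table: raw variable name -> harmonized name, per trial
CONCORDANCE = {
    "trial_A": {"subj_id": "participant_id", "opi_use30": "opioid_use_past30d"},
    "trial_B": {"id": "participant_id", "opioid30": "opioid_use_past30d"},
}

def harmonize_record(trial, record):
    """Rename one trial record's variables to the shared codebook,
    dropping variables that have no concordance entry."""
    mapping = CONCORDANCE[trial]
    return {mapping[k]: v for k, v in record.items() if k in mapping}
```

Applying the same map to every record in a trial yields datasets with identical variable names across trials, which is what makes pooled or comparative analyses straightforward downstream.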

20
Meta-Analysis of Risk of Vaccine-Induced Immune Thrombotic Thrombocytopenia Following ChAdOx1-S Recombinant Vaccine

Chan, B. T. B.; Bobos, P.; Odutayo, A.; Pai, M.

2021-05-08 hematology 10.1101/2021.05.04.21256613
Top 0.1%
126× avg

Context: Vaccine-induced immune thrombotic thrombocytopenia (VITT) has been reported after administering the ChAdOx1-S recombinant COVID-19 vaccine (marketed as Vaxzevria by AstraZeneca; Covishield). Estimates of incidence vary between countries, due to different age distributions chosen, case definitions and choice of denominator (persons vaccinated vs immunizations given). This study clarifies these estimates by pooling data from ten countries and examining differences by age group. Methods: We examined case reports, press releases and immunization data and calculated pooled estimates of VITT incidence using random effects models. Sensitivity analyses considered different combinations of countries and varying assumptions on time between vaccination and reporting of cases. Results: Pooling all countries, VITT incidence was 0.73 per 100,000 persons receiving a first dose of Covishield/Vaxzevria [95% CI 0.43, 1.23]. Incidence for age 65 and over was 0.11 per 100,000 persons [95% CI 0.05, 0.26], and significantly higher among those under age 55: 1.67 per 100,000 persons [95% CI 1.30, 2.14] in the UK and 5.06 per 100,000 persons in Norway [95% CI 2.16, 11.86]. The latter had the best data on counts of persons vaccinated. Incidence for age 55 to 64 years was 0.34 [95% CI 0.13, 0.85] in the UK, lower than for under age 55. Conclusion: VITT is a rare vaccine-associated adverse event. Incidence estimates vary between jurisdictions. However, even the highest reported incidence, from Norway, is low - and in settings with high community transmission, lower than the risk of serious outcomes associated with COVID-19. Policymakers and individuals can use these data to calculate risk-benefit ratios and better target vaccine distribution.
Essentials
- This paper measures risk of vaccine-induced immune thrombotic thrombocytopenia (VITT) after ChAdOx1-S recombinant COVID-19 vaccine.
- Pooled estimates of incidence were calculated with a random effects model based on data from 10 countries.
- Overall risk is 1 in 139,000; for age 65 and over, about 1 in 1,000,000; for age under 55, between 1 in 20,000 and 1 in 60,000.
- VITT risk is low and varies by age. These data can inform policies around vaccination distribution.
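A random-effects pooling step like the one described can be sketched with the DerSimonian-Laird estimator (an illustrative implementation, not the authors' code; inputs are per-country incidence estimates and their variances, typically on the log scale):

```python
import math

def dersimonian_laird(estimates, variances):
    """Pool study-level estimates with a DerSimonian-Laird random-effects
    model; returns the pooled estimate and its standard error."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * e for wi, e in zip(w, estimates)) / sum(w)
    # Cochran's Q measures between-study heterogeneity around the fixed effect
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)  # between-study variance, floored at zero
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se
```

When heterogeneity is present (tau2 > 0), the random-effects weights shrink toward equality across countries, widening the confidence interval relative to a fixed-effect pooling, which matches the wide intervals reported in the abstract.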